Symmetries and discriminability in feedforward network architectures


Abstract



This paper investigates the effects of introducing symmetries into feedforward neural networks in what are termed symmetry networks. This technique allows more efficient training for problems in which we require the output of a network to be invariant under a set of transformations of the input. The particular problem of graph recognition is considered. In this case the network is designed to d...
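
The abstract describes constraining a feedforward network so that its output is invariant under a chosen set of input transformations. As a rough illustration only, and not the paper's exact construction, the sketch below applies one shared first-layer weight matrix to every cyclic shift of the input and pools the activations by summation, which forces the output to be unchanged under cyclic shifts; the group, layer sizes, and activation function are all assumptions.

```python
import numpy as np

def cyclic_shifts(x):
    """All cyclic shifts of x -- the assumed transformation group under
    which the output should be invariant."""
    return [np.roll(x, k) for k in range(len(x))]

class InvariantNet:
    """Toy feedforward net: one shared first-layer weight matrix applied to
    every shifted copy of the input, pooled by summation, then a linear readout."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(size=(n_hidden, n_in))  # shared across all shifts
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(size=n_hidden)

    def forward(self, x):
        # Summing over the whole orbit of x makes the result unchanged when
        # x itself is replaced by any of its cyclic shifts.
        h = sum(np.tanh(self.W1 @ s + self.b1) for s in cyclic_shifts(x))
        return float(self.w2 @ h)

net = InvariantNet(n_in=6, n_hidden=4)
x = np.arange(6.0)
print(net.forward(x), net.forward(np.roll(x, 2)))  # identical values
```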

Similar articles

Feedforward Approximations to Dynamic Recurrent Network Architectures

Recurrent neural network architectures can have useful computational properties, with complex temporal dynamics and input-sensitive attractor states. However, evaluation of recurrent dynamic architectures requires solving systems of differential equations, and the number of evaluations required to determine their response to a given input can vary with the input or can be indeterminate altogeth...
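
This snippet contrasts recurrent dynamics, which require integrating differential equations, with feedforward evaluation. A minimal sketch of the general idea, not necessarily the authors' method: unrolling a simple rate-model ODE for a fixed number of Euler steps turns the evaluation into a fixed-depth feedforward pass. The dynamics dr/dt = -r + tanh(W r + U x), the step size, and the depth are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
W = 0.1 * rng.normal(size=(8, 8))   # recurrent weights (toy values)
U = rng.normal(size=(8, 3))         # input weights
x = rng.normal(size=3)              # constant input

def feedforward_approx(x, steps=20, dt=0.1):
    """Fixed number of Euler steps of dr/dt = -r + tanh(W r + U x).
    With the depth fixed in advance, this is just a deep feedforward computation."""
    r = np.zeros(8)
    for _ in range(steps):
        r = r + dt * (-r + np.tanh(W @ r + U @ x))
    return r

print(feedforward_approx(x))
```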


Feedforward Neural Network Architectures for Complex Classification Problems

This paper presents two neural network design strategies for incorporating a priori knowledge about a given problem into feedforward neural networks. These strategies aim at obtaining tractability and reliability for solving complex classification problems with neural networks. The first type of strategy, based on a multistage scheme, decomposes the problem into manageable ones for reducing the compl...


Optimal Modular Feedforward Neural Nets Based on Functional Network Architectures

Functional networks combine both domain and data knowledge to develop optimal network architectures for some types of interesting problems. The topology of the network is obtained from qualitative domain knowledge, and data is used to fit the processing functions appearing in the network; these functions are supposed to be linear combinations of known functions from appropriate families. In thi...
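
This entry describes fitting each processing function as a linear combination of known functions from a chosen family. Below is a minimal sketch of that fitting step only, assuming a polynomial family and an ordinary least-squares fit; both choices are assumptions, not details taken from the paper.

```python
import numpy as np

# Assumed basis family for one processing function f(x) ~ sum_k c_k * phi_k(x)
basis = [lambda x: np.ones_like(x), lambda x: x, lambda x: x**2, lambda x: x**3]

def fit_processing_function(x, y):
    """Least-squares fit of the coefficients c_k of the linear combination."""
    Phi = np.column_stack([phi(x) for phi in basis])
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coeffs

# Toy data: noisy samples of an unknown processing function.
x = np.linspace(-1.0, 1.0, 50)
y = 0.5 - 2.0 * x + 0.3 * x**3 + 0.01 * np.random.default_rng(0).normal(size=x.size)
print(fit_processing_function(x, y))   # approximately [0.5, -2.0, 0.0, 0.3]
```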


Generalized neuron: Feedforward and recurrent architectures

Feedforward neural networks such as multilayer perceptrons (MLP) and recurrent neural networks are widely used for pattern classification, nonlinear function approximation, density estimation and time series prediction. A large number of neurons are usually required to perform these tasks accurately, which makes the MLPs less attractive for computational implementations on resource constrained ...



Journal

Journal title: IEEE Transactions on Neural Networks

Year: 1993

ISSN: 1045-9227

DOI: 10.1109/72.248459